Neural network approximation: Three hidden layers are enough

Authors

Abstract

A three-hidden-layer neural network with super approximation power is introduced. This network is built with the floor function (⌊x⌋), the exponential function (2^x), the step function (1_{x≥0}), or their compositions as the activation function in each neuron, and hence we call such networks Floor-Exponential-Step (FLES) networks. For any width hyper-parameter N ∈ ℕ+, it is shown that FLES networks with width max{d, N} and three hidden layers can uniformly approximate a Hölder continuous function f on [0,1]^d with an exponential approximation rate 3λ(2√d)^α 2^(−αN), where α ∈ (0,1] and λ > 0 are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function f on [0,1]^d with a modulus of continuity ω_f(·), the constructive approximation rate is 2ω_f(2√d)2^(−N) + ω_f(2√d·2^(−N)). Moreover, this result extends to general bounded continuous functions on a bounded set E ⊆ ℝ^d. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of ω_f(r) as r → 0 is moderate (e.g., ω_f(r) ≲ r^α for Hölder continuous functions), since the major term of concern in the approximation rate is essentially √d times a function of N independent of d within the modulus of continuity. Finally, the analysis is extended to derive similar approximation results in the L^p-norm for p ∈ [1,∞) via replacing Hölder continuous functions by measurable functions.
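To make the abstract's quantities concrete, here is a minimal sketch (hypothetical, not the authors' construction): the three activation functions that give FLES networks their name, and a helper `fles_rate` evaluating the stated uniform error bound 3λ(2√d)^α 2^(−αN) for a Hölder target of order α and constant λ.

```python
import math

def floor_act(x):
    """Floor activation: x -> ⌊x⌋."""
    return math.floor(x)

def exp_act(x):
    """Exponential activation: x -> 2^x."""
    return 2.0 ** x

def step_act(x):
    """Step activation: x -> 1 if x >= 0 else 0 (i.e. 1_{x>=0})."""
    return 1.0 if x >= 0 else 0.0

def fles_rate(d, N, alpha=1.0, lam=1.0):
    """Upper bound 3*lam*(2*sqrt(d))^alpha * 2^(-alpha*N) on the uniform
    approximation error of a width-max{d, N}, three-hidden-layer FLES
    network for a Hölder continuous target of order alpha and constant lam."""
    return 3.0 * lam * (2.0 * math.sqrt(d)) ** alpha * 2.0 ** (-alpha * N)
```

Note that for fixed α the bound decays geometrically in the width parameter N, while the dimension d enters only through the moderate factor (2√d)^α, which is the sense in which the rate sidesteps the curse of dimensionality.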


Similar articles

Dual Nature Hidden Layers Neural Networks

We present here a new scheme to construct a neural network architecture based on the physiological properties of the biological neuron to enhance its performance. The new scheme divides every hidden layer into two parts to facilitate the processing of 0 and 1 separately and reduces the total number of interconnections considerably. The first part consists of units that receive signals only from '1 ...


Bayesian adaptation of hidden layers in Boolean feedforward neural networks

In this paper a statistical point of view of feedforward neural networks is presented. The hidden layer of a multilayer perceptron neural network is identified as representing the mapping of random vectors. Utilizing hard-limiter activation functions, the second and all further layers of the multilayer perceptron, including the output layer, represent the mapping of a Boolean function. Boolean ...


Multilayer Neural Networks: One or Two Hidden Layers?

We study the number of hidden layers required by a multilayer neural network with threshold units to compute a function f from ℝ^d to {0, 1}. In dimension d = 2, Gibson characterized the functions computable with just one hidden layer, under the assumption that there is no "multiple intersection point" and that f is only defined on a compact set. We consider the restriction of f to the neighbo...


Approximation with Diffusion-Neural-Network

Neural information processing models largely assume that the samples for training a neural network are sufficient. Otherwise, there exists a non-negligible error between the real function and the function estimated from a trained network. To reduce this error, in this paper we suggest a diffusion-neural-network (DNN) to learn from a small sample. First, we show the principle of information diffusion us...


Three Layers Approach for Network Scanning Detection

Computer networks have become one of the most important dimensions of any organization. This importance is due to the connectivity benefits that networks can provide, such as computing power, data sharing and enhanced performance. However, using networks comes with a cost: there are threats and issues that need to be addressed, such as providing a sufficient level of security. One of the most ...



Journal

Journal title: Neural Networks

Year: 2021

ISSN: 1879-2782, 0893-6080

DOI: https://doi.org/10.1016/j.neunet.2021.04.011